Hepatology Communications
Ovid Technologies (Wolters Kluwer Health)
Preprints posted in the last 7 days, ranked by how well they match the content profile of Hepatology Communications, based on 21 papers previously published in the journal. The average preprint has a 0.02% match score for this journal, so anything above that is an above-average fit.
Xie, R.; Schöttker, B.
Background & Aims: Clonal hematopoiesis of indeterminate potential (CHIP) has been linked to chronic liver disease progression, yet its role across the full spectrum of metabolic dysfunction-associated steatotic liver disease (MASLD), from its initial development to end-stage complications, remains unclear. We aimed to comprehensively investigate the association of CHIP and its major subtypes with both the incidence and progression of MASLD. Methods: We conducted a prospective cohort study of 353,218 UK Biobank participants, stratified into a healthy cohort free of MASLD at baseline (Cohort 1; n=230,270) and a prevalent MASLD cohort (Cohort 2; n=122,948). CHIP was ascertained from whole-exome sequencing data. We used multivariable Cox regression, competing risk models, and mediation analyses to assess the associations of CHIP (overall, by driver gene, and by clone size) with incident MASLD, cirrhosis, hepatocellular carcinoma (HCC), and liver-related death. Results: In Cohort 1, CHIP was associated with an increased risk of incident MASLD (HR 1.25, 95% CI 1.08-1.44) and cirrhosis (HR 1.57, 95% CI 1.10-2.25). These associations were driven by non-DNMT3A mutations, particularly TET2, and showed a linear dose-response relationship with clone size. In Cohort 2, non-DNMT3A CHIP was associated with progression to cirrhosis (HR 1.82, 95% CI 1.28-2.58). The associations were more pronounced in males and in individuals without obesity or diabetes. C-reactive protein partially mediated the CHIP-MASLD association. Conclusion: CHIP, driven predominantly by non-DNMT3A mutations (particularly TET2), is an independent risk factor for both the development and progression of MASLD. These findings position CHIP as a novel player in the pathophysiology of MASLD and suggest potential avenues for risk stratification and targeted anti-inflammatory intervention.
Impact and Implications: This large-scale, prospective study establishes clonal hematopoiesis of indeterminate potential (CHIP) as a novel and independent risk factor for the entire spectrum of metabolic dysfunction-associated steatotic liver disease (MASLD), from its initial development to its progression to cirrhosis and liver-related death. For hepatologists and hematologists, these findings identify a genetically defined, high-risk subpopulation, particularly individuals with non-DNMT3A mutations, who may benefit from enhanced liver surveillance. The identification of systemic inflammation as a partial mediator of the CHIP-MASLD association suggests that anti-inflammatory therapies currently under development for liver disease could represent a targeted treatment strategy for this growing patient population.
Herrera, L.; Meneses, M. J.; Ribeiro, R. T.; Gardete-Correia, L.; Raposo, J. F.; Boavida, J. M.; Penha-Goncalves, C.; Macedo, M. P.
Background & Aims: Metabolic disorders such as dyslipidemia, metabolic dysfunction-associated steatotic liver disease (MASLD), and diabetes are promoted by chronic pro-inflammatory and pro-oxidative states. Paraoxonase 1 (PON1), a liver-derived HDL-associated enzyme, plays an important antioxidant role by hydrolyzing oxidized lipids and protecting against oxidative stress-induced damage. Genetic variation in PON1, particularly in promoter and coding regions, modulates enzyme expression and activity, thereby influencing susceptibility to metabolic and cardiovascular diseases. This study investigated the genetic determinants of serum paraoxonase (PONase) activity and their relationship with dysmetabolic phenotypes. Methods: A genome-wide association study was conducted in 922 Portuguese individuals from the PREVADIAB2 cohort. Genetic variants and haplotypes related to PONase activity were analyzed, and associations with dysglycemia and liver fibrosis were evaluated in individuals aged over 55 years. Results: We identified two key PON1 variants as determinants of PONase activity: rs2057681 (in strong linkage disequilibrium with the non-synonymous Q192R variant) and rs854572 (located in the promoter region). Analysis of rs854572-rs2057681 haplotypes revealed that specific combinations differentially modulate PONase activity and confer risk or protection for dysglycemia and liver fibrosis, depending on the rs2057681 genotype context. Notably, although PONase activity was strongly associated with PON1 variants, it did not directly correlate with dysmetabolic phenotypes, suggesting that genetic context and haplotype structure, rather than enzyme activity alone, shape disease susceptibility. Conclusions: These findings highlight the complex genetic architecture of PON1 and its role in metabolic disease risk, supporting the use of PON1 genetic information to uncover predisposition to dysmetabolic conditions.
Our results provide insights into the interplay between PON1 genetics, enzyme function, and dysmetabolism, with implications for risk stratification in metabolic liver disease. Lay Summary: PON1 is a liver-derived gene that encodes an enzyme involved in protection against oxidative stress, a key contributor to metabolic liver disease and diabetes. In this study, we found that specific combinations of PON1 genetic variants are associated with abnormalities in blood glucose regulation and with markers of liver fibrosis. These associations were dependent on genetic configuration rather than enzyme activity alone, suggesting that PON1 genetic information may help identify individuals at higher risk of metabolic liver disease.
Haeusler, I. L.; Etoori, D.; Campbell, C. N. J.; McDonald, S. L. R.; Lopez Bernal, J.; Mounier-Jack, S.; Kasstan-Dabush, B.; McDonald, H. I.; Parker, E. P. K.; Suffel, A.
Background: In England, individuals with chronic liver disease (CLD) are among those with the lowest seasonal influenza vaccine uptake despite being at elevated risk of severe influenza. We examined the relationship between CLD severity and aetiology, and influenza vaccine uptake in England. Methods: A retrospective cohort study of adults (18-115 years) using Clinical Practice Research Datalink Aurum primary care data was conducted for five seasons (2019/20-2023/24). Poisson regression was used to estimate rates of uptake by CLD severity (clinical diagnoses categorised as low, moderate, or severe) and aetiology (alcohol-related, viral-related, and diagnoses in the Green Book guidelines). Findings: There were 182,174-277,470 individuals with CLD per seasonal cohort. Among those who were additionally age-eligible for vaccination, uptake was 71.1-79.7% compared to 30.9-40.5% in those not additionally age-eligible. Among individuals below age eligibility without other comorbidities, severity was associated with higher uptake (incidence rate ratio [IRR] moderate 1.80, 95% CI 1.69-1.90; severe 1.95, 95% CI 1.84-2.08 in 2023/24); there was no effect in those with at least one additional comorbidity (moderate 1.05, 95% CI 0.99-1.10; severe 1.05, 95% CI 1.01-1.09). Alcohol- and viral-related aetiology were also associated with increased uptake in those not additionally age-eligible. Among individuals meeting age eligibility without additional comorbidities, severity was associated with reduced uptake (moderate 0.81, 95% CI 0.73-0.90; severe 0.79, 95% CI 0.74-0.85), with attenuation in those with additional comorbidities (moderate 0.99, 95% CI 0.94-1.04; severe 0.91, 95% CI 0.89-0.94).
Interpretation: CLD severity and aetiology were important determinants of uptake in the absence of additional indications for influenza vaccination. Future research should prioritise understanding facilitators and barriers to vaccine uptake in individuals with CLD, particularly for those at highest risk of severe infection. Funding: NIHR Health Protection Research Unit in Vaccines and Immunisation (NIHR200929/NIHR207408). Research in context. Evidence before this study: We searched PubMed up to June 2025 using the terms "chronic liver disease", "cirrhosis", "hepatitis", "influenza vaccination", "seasonal influenza", and "vaccine uptake". Previous research, including national data from England, has shown that people with chronic liver disease tend to have lower seasonal influenza vaccine uptake than individuals with other medical comorbidities which qualify for vaccination, such as diabetes, chronic kidney disease, or immunosuppression. The reasons for low influenza vaccine uptake in people with chronic liver disease are not well understood, and it is therefore difficult for vaccination providers, principally primary care services in England, to tailor interventions aimed to increase uptake. Qualitative research involving individuals aged less than 65 years living in England with clinical risk comorbidities, most commonly diabetes, found that chronic disease management pathways inconsistently provided information about the importance of influenza vaccination as part of chronic disease management. Individuals with long-term conditions reported low perceived risk of influenza infection and limited awareness of vaccine benefits as important reasons for non-uptake. We hypothesised that the severity and aetiology of chronic liver disease may be important determinants of uptake. Added value of this study: We conducted a population-based study to examine how chronic liver disease severity and aetiology influence seasonal influenza vaccine uptake in adults in England.
Using primary care electronic health record data from five consecutive influenza seasons (2019/20-2023/24), we found that more severe chronic liver disease was associated with a substantial increase in vaccine uptake in those without additional indications for seasonal influenza vaccination (age-based eligibility or other qualifying clinical risk comorbidities). Alcohol- and viral-related aetiology were also associated with increased uptake in those who were not additionally age-eligible for vaccination. In contrast, severity, alcohol- and viral-related underlying aetiology were associated with a modest reduction in uptake for individuals with chronic liver disease who also qualified for vaccination due to age. Implications of all the available evidence: Despite clear clinical vulnerability to infection and a substantially elevated risk of morbidity and mortality following infection, a large proportion of adults with chronic liver disease, particularly those aged under 65 years, remain unvaccinated against seasonal influenza each year. This study suggests that chronic liver disease severity and underlying aetiology are important determinants of uptake in individuals not meeting age-based vaccine eligibility, particularly in those without additional clinical risk comorbidities. This could be because of differing perceptions of influenza risk, or due to varying degrees of interaction with healthcare specialists as part of chronic disease management. In individuals who met age-based vaccination eligibility, the negative effect of severity on influenza vaccine uptake may reflect greater barriers to accessing vaccination services by those with more complex health needs, or competing medical priorities for long-term condition management during consultations. To inform targeted vaccination strategies, future research should aim to understand the specific facilitators and barriers to influenza vaccination experienced by individuals with chronic liver disease.
This should include perspectives of individuals with different disease severity, across different age groups, in those with and without additional co-morbidities.
Soundararajan, V.; Venkatakrishnan, A. J.; Murugadoss, K.; K, P.; Varma, G.; Aman, A.
Semaglutide has shown benefit in metabolic dysfunction-associated steatohepatitis (MASH), but real-world evidence across longitudinal liver phenotypes remains limited, particularly regarding how liver remodeling relates to weight loss and dose exposure. Using a de-identified federated electronic health record network spanning more than 29 million patients in the United States, including 489,785 semaglutide-treated adults, we analyzed 6,734 patients with baseline liver disease burden. We find that higher attained pre-landmark (0-2 years) semaglutide dose was associated with lower post-landmark (2-4 years) risk of steatohepatitis, alcoholic liver disease, and all-cause mortality, whereas greater pre-landmark weight loss was associated with lower post-landmark risk of steatohepatitis, steatotic liver disease, and hepatorenal syndrome, indicating distinct dose- and weight-linked patterns of long-term liver benefits. These associations were notable because semaglutide prescribing was generally lower during the post-landmark period, raising the possibility of durable benefit beyond peak exposure. To better understand the mechanistic bases of liver protection, we performed a complementary longitudinal study of 326 adults with paired noninvasive liver elastography measurements before and after treatment initiation. Median liver stiffness decreased from 4.85 [3.02-7.20] to 3.9 [2.6-5.8] kPa after semaglutide initiation (median change = -0.38 kPa; p<0.001), with 194 of 326 patients (59.5%) showing lower follow-up stiffness. A clinically meaningful reduction of at least 20% was observed in 133 of 326 patients (40.8%), and 69 of 326 (21.2%) shifted to a lower fibrosis stage by prespecified elastography thresholds. Larger improvements were also seen in patients with higher baseline stiffness (p<0.001); notably, 80% of patients with cirrhosis-range baseline stiffness (≥12.5 kPa) achieved ≥20% improvement versus 29.5% with minimal baseline disease (p<0.001).
The proportion achieving at least 20% stiffness improvement was similar across weight-loss strata, including patients with no weight loss or weight gain and those with at least 10% weight loss (38.0% in each group), and liver stiffness change showed negligible correlation with changes in weight, BMI, HbA1c, alanine aminotransferase, or aspartate aminotransferase. To provide biological context, single-cell RNA analyses demonstrated sparse overall hepatic GLP1R expression (0.0239%), with enrichment in non-parenchymal niches including cholangiocytes, intrahepatic cholangiocytes, liver sinusoidal endothelial cells, and hepatic stellate cells implicated in fibrogenesis and vascular remodeling. Together, this real-world evidence suggests diverse liver benefits for semaglutide beyond weight loss, with intricate dose-response relationships.
Varughese, S.; Huang, M.; Savige, J.
Autosomal dominant polycystic liver disease (ADPLD) commonly results from a pathogenic variant in one of six genes (GANAB, ALG8, LRP5, PRKCSH, SEC61B, SEC63). Pathogenic variants in these genes are also associated with kidney cysts, which rarely cause kidney failure, but the genes are included in cystic kidney panels. This study determined the frequency of predicted pathogenic variants in the ADPLD genes in the general population. Variants for each gene were downloaded from gnomAD and annotated with ANNOVAR. The population frequencies were calculated from the number of people with "predicted pathogenic" variants in gnomAD v.2.1.1: loss-of-function, structural, and copy number; null; and rare, computationally-damaging missense changes that affected a conserved residue. Frequencies were also estimated from the number of gnomAD v.4.1 variants assessed as Pathogenic or Likely pathogenic in ClinVar. Predicted pathogenic variants affected one in 95 people using our strategy and gnomAD v.2.1.1, and one in 151 with ClinVar assessments of gnomAD v.4.1 variants. LRP5 and ALG8, which are associated with a milder clinical phenotype, were the commonest affected genes with both strategies. Predicted pathogenic variants in ADPLD appear more frequent in admixed American (one in 100), Finnish (one in 107), and African/African American (one in 130) people (all p<0.0001 compared with Europeans, one in 197). Predicted pathogenic variants for ADPLD may be even more common because of additional unidentified causative genes. However, not all ADPLD variants result in liver cysts, nor indeed cystic kidneys, because of incomplete penetrance and variable expressivity.
Chandra, S.
Background. Pancreatic ductal adenocarcinoma (PDAC) has a five-year survival rate of approximately 12%, largely because it is typically diagnosed at an advanced stage. CT-based computational methods for early detection exist but rely on black-box deep learning or large texture feature sets without tissue-specific interpretability. Methods. We developed Virtual Spectral Decomposition (VSD), which applies six parameterized sigmoid functions S(HU) = 1/(1+exp(-alpha x (HU - mu))) to standard portal-venous CT, decomposing each pixel into tissue-specific response channels for fat (mu=-60), fluid (mu=10), parenchyma (mu=45), stroma (mu=75), vascular (mu=130), and calcification (mu=250). Dendritic Binary Gating identifies structural content per channel using morphological filtering, enabling co-firing analysis and lone firer identification. A 25-feature signature was extracted per patient. Three independent datasets were analyzed: NIH Pancreas-CT (n=78 healthy), Medical Segmentation Decathlon Task07 (n=281 PDAC, paired tumor/adjacent tissue), and CPTAC-PDA from The Cancer Imaging Archive (n=82, multi-institutional, with DICOM time point tags). The same six sigmoid parameters were used across all datasets without retraining. Results. VSD achieved AUC 0.943 for field effect detection (healthy vs cancer-adjacent parenchyma) and AUC 0.931 for patient-stratified tumor specification on MSD. On CPTAC-PDA, VSD achieved AUC 0.961 (6 features) and 0.979 (25 features) for distinguishing healthy from cancer-bearing pancreas on scans obtained prior to pathological diagnosis. All significant features replicated across datasets in the same direction: z_fat (d=-2.10, p=3.5e-27), z_fluid (d=-2.76, p=2.4e-38), fire_fat (d=+2.18, p=1.2e-28). Critically, VSD severity did not correlate with days-from-diagnosis (r=-0.008, p=0.944) across a range of day -1394 to day +249. 
Patient C3N-01375, scanned 3.8 years before pathological diagnosis, had VSD severity 1.87, well above the healthy mean of 0.94 ± 0.33. The tissue transformation signature was temporally stable, indicating an early, persistent tissue state rather than a progressively worsening process. Conclusions. VSD with Dendritic Binary Gating detects a stable pancreatic tissue composition signature on standard CT that is present years before clinical diagnosis, validated across three independent datasets without parameter adjustment. The six sigmoid channels map to biologically meaningful tissue components through a fully transparent interpretability chain. The temporal stability of the signal implies a detection window of 3-7 years, consistent with known PanIN-3 microenvironment transformation timelines. VSD functions as a single-scan screening tool applicable to any abdominal CT performed during the pre-clinical window.
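The six-channel decomposition described in this abstract can be sketched directly from the reported parameters. This is an illustrative reconstruction only: the abstract gives the six channel midpoints (mu, in Hounsfield units) but not the steepness alpha, so the alpha used here is an assumed placeholder, not a value from the paper.

```python
import math

# Channel midpoints mu in Hounsfield units, as reported in the abstract.
CHANNELS = {
    "fat": -60, "fluid": 10, "parenchyma": 45,
    "stroma": 75, "vascular": 130, "calcification": 250,
}

def vsd_response(hu, alpha=0.1):
    """Virtual Spectral Decomposition of a single CT pixel.

    Applies S(HU) = 1/(1 + exp(-alpha * (HU - mu))) per channel.
    alpha=0.1 is an assumption; the paper parameterizes but does not
    report the steepness in this abstract.
    """
    return {name: 1.0 / (1.0 + math.exp(-alpha * (hu - mu)))
            for name, mu in CHANNELS.items()}

resp = vsd_response(45)  # a pixel at the parenchyma midpoint
# at HU == mu, that channel's sigmoid response is exactly 0.5
```

Note that the sigmoids act as cumulative soft thresholds rather than disjoint bins: a parenchyma-range pixel also responds strongly on the fat and fluid channels, whose midpoints lie below it, which is presumably why the method pairs the sigmoids with a separate gating step (Dendritic Binary Gating) to identify structural content per channel.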
Flevaris, K.; Trbojevic-Akmacic, I.; Goh, D.; Lalli, J. S.; Vuckovic, F.; Capin Vilaj, M.; Stambuk, J.; Kristic, J.; Mijakovac, A.; Ventham, N.; Kalla, R.; Latiano, A.; Manetti, N.; Li, D.; McGovern, D. P. B.; Kennedy, N. A.; Annese, V.; Lauc, G.; Satsangi, J.; Kontoravdi, C.
Background and Aims: Alterations in immunoglobulin G (IgG) N-glycosylation are implicated in inflammatory bowel disease (IBD); however, the robustness of IgG glycan signatures across IBD cohorts with diverse demographics and geographic origins remains underexplored. We aimed to determine whether compositional data analysis (CoDA) and machine learning (ML) can identify IBD-related IgG N-glycan signatures and whether these signatures capture disease-associated acceleration of biological aging. Methods: We analyzed the IgG glycome profiles of 1,367 plasma samples collected from healthy controls (HC), symptomatic controls (SC), and people with newly diagnosed Crohn's disease (CD) and ulcerative colitis (UC) across four cohorts (UK, Italy, United States, and Netherlands). IgG glycosylation was analyzed by ultra-high-performance liquid chromatography, yielding 24 total-area-normalized glycan peaks (GPs). Analyses were performed using cross-sectional data obtained at baseline. CoDA-powered association analyses were used to identify disease-related effects on GPs while controlling for demographic covariates. ML models were trained and evaluated to assess generalizability to unseen cohorts and demographic subgroups, with a focus on discrimination and reliability. Results: Across all cohorts, people with IBD demonstrated accelerated biological aging as quantified by the GlycanAge index. This was accompanied by consistent reductions in IgG galactosylation, with effects partially modulated by age. Classification models trained on glycomics and demographics achieved robust discrimination (AUROC ~0.80) between non-IBD (HC+SC) and IBD across cohorts. Conclusion: These findings reveal accelerated biological aging in people with IBD and support the translational potential of IgG glycans as biomarkers and a novel route toward clinically interpretable personalized risk estimates.
Phillips, V.; Woodwal, P.
Background: Artificial intelligence and machine learning (AI/ML) are among the fastest-growing domains in NIH research funding, but whether children have shared equitably in this expansion is unknown. We characterized pediatric representation in NIH AI/ML funding from fiscal years (FY) 2020 to 2024. Methods: NIH grant data were obtained from Research Portfolio Online Reporting Tools Expenditures and Results bulk files for FY2020 to FY2024. AI/ML grants were identified using the NIH Research, Condition, and Disease Categorization "Machine Learning and Artificial Intelligence" category, and pediatric grants using the "Pediatric" category. Subprojects were excluded. Grants were deduplicated within each fiscal year by core project number for trend analyses, and across all years retaining the most recent fiscal year for cross-sectional totals. Disease areas were identified by keyword searches of titles and abstracts. Results: Across FY2020 to FY2024, 5,624 unique NIH AI/ML grants totaling $3,371 million were identified. Of these, 836 grants (14.9%) were classified as pediatric, representing $401 million (11.9%) of total NIH AI/ML funding. Although this share was consistent with the historically reported overall NIH pediatric funding baseline of approximately 10% to 12%, it remained substantially below the US pediatric population share of approximately 22%. The pediatric share of NIH AI/ML funding declined from 12.3% in FY2020 to 10.8% in FY2024, despite growth in absolute pediatric funding. Indexed to FY2020, pediatric AI/ML funding grew approximately 2.6-fold compared with 3.0-fold growth in the total portfolio. Across disease areas, unadjusted adult/general-to-pediatric funding ratios ranged from 2.0-fold in mental health to 9.8-fold in cancer. Conclusions: Pediatric representation in NIH AI/ML funding remained low and declined over time as the overall portfolio expanded.
These findings suggest that growth in NIH AI/ML investment has not been matched by proportional gains for pediatric research.
Lahtinen, E.; Schigiltchoff, N.; Jia, K.; Kundrot, S.; Palchuk, M. B.; Warnick, J.; Chan, L.; Shigiltchoff, N.; Sawhney, M. S.; Rinard, M.; Appelbaum, L.
Background and Aims: Pancreatic ductal adenocarcinoma (PDAC) surveillance is limited to individuals with familial or genetic risk, although most future cases arise outside these groups. In a retrospective study, PRISM, an electronic health record (EHR)-based PDAC risk model, identified individuals in the general population at elevated near-term risk of PDAC. We aimed to prospectively evaluate whether PRISM can identify high-risk individuals beyond current surveillance groups across U.S. health systems. Methods: We performed a prospective multicenter cohort study after deployment of PRISM in April 2023 across 44 U.S. health care organizations. Eligible adults aged ≥40 years without prior PDAC received a single baseline risk score and were assigned to prespecified risk tiers. Patients were followed for incident PDAC for 30 months. We estimated tier-specific 30-month cumulative incidence (positive predictive value, PPV), number needed to screen (NNS), standardized incidence ratios (SIRs), and time from deployment and first high-risk flag to diagnosis. Results: Among 6,282,123 adults assigned a PRISM score, 5,058,067 had follow-up; 3,609 developed PDAC. The highest-risk tier had 30-fold higher PDAC incidence than the study population. At the SIR 5 threshold, 30-month cumulative incidence was 0.35% (NNS, 284.2); at SIR 16, 1.14% (NNS, 87.4); and at SIR 30, 2.19% (NNS, 45.7). Median time from deployment to PDAC diagnosis was 9.5 months, and median time from first high-risk flag to diagnosis at SIR 5 was 3.5 years. Shapley additive explanations (SHAP) analyses supported patient- and tier-level interpretability. Conclusions: Prospective deployment of PRISM across multiple U.S. health care organizations identified individuals at elevated near-term risk for PDAC, with substantial risk enrichment and lead time before diagnosis. These findings support the real-world scalability and generalizability of EHR-based risk stratification for risk-adapted early detection.
ClinicalTrials.gov identifier NCT05973331
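The tier metrics reported in this abstract are linked by a simple identity: the number needed to screen is the reciprocal of a tier's 30-month cumulative incidence (its PPV). A minimal sketch; note that recomputing from the rounded published PPVs gives values close to, but not exactly matching, the published NNS figures (which were presumably derived from unrounded PPVs).

```python
def number_needed_to_screen(ppv):
    """NNS = 1 / PPV: how many patients in a risk tier must be screened,
    on average, to find one incident PDAC case over the 30-month horizon."""
    if not 0 < ppv <= 1:
        raise ValueError("PPV must be a proportion in (0, 1]")
    return 1.0 / ppv

# Rounded tier PPVs from the abstract: SIR 5 -> 0.35%, SIR 16 -> 1.14%,
# SIR 30 -> 2.19% (reported NNS: 284.2, 87.4, 45.7 respectively).
for sir, ppv in [(5, 0.0035), (16, 0.0114), (30, 0.0219)]:
    print(f"SIR {sir}: NNS ~ {number_needed_to_screen(ppv):.1f}")
```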
Xu, M.; Philips, R.; Singavarapu, A.; Zheng, M.; Martin, D.; Nikolin, S.; Mutz, J.; Becker, A.; Firenze, R.; Tsai, L.-H.
Background: Gamma oscillation dysfunction has been implicated in neuropsychiatric disorders. Restoring gamma oscillations via brain stimulation represents an emerging therapeutic approach. However, the strength of its clinical effects and treatment moderators remain unclear. Method: We conducted a systematic review and meta-analysis to examine the clinical effects of gamma neuromodulation in neuropsychiatric disorders. A literature search for controlled trials using gamma stimulation was performed across five databases up until April 2025. Effect sizes were calculated using Hedges' g. Separate analyses using the random-effects model examined the clinical effects in schizophrenia (SZ), major depressive disorder (MDD), bipolar disorder (BD), and autism spectrum disorder (ASD). For SZ and MDD, subgroup analyses evaluated the effects of stimulation modality, stimulation frequency, treatment duration, and pulses per session. Result: Fifty-six studies met the inclusion criteria (SZ n = 943, MDD n = 916, BD n = 175, ASD n = 232). In SZ, gamma stimulation was associated with improvements in positive (k = 10, g = -0.60, p < 0.001), negative (k = 12, g = -0.37, p = 0.03), depressive (k = 8, g = -0.39, p < 0.001), and anxious symptoms (k = 5, g = -0.59, p < 0.001), and in overall cognitive function (k = 7, g = 0.55, p < 0.001). Stimulation frequency and treatment duration moderated therapeutic effects. In MDD, reductions in depressive symptoms were observed (k = 23, g = -0.34, p = 0.007). Conclusion: Gamma neuromodulation showed moderate therapeutic benefits in SZ and MDD. Substantial heterogeneity likely reflects protocol differences, highlighting the need for well-powered future trials.
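For context, Hedges' g, the effect size used throughout this meta-analysis, is Cohen's d rescaled by a small-sample bias correction. A minimal sketch of the standard two-independent-groups formula (illustrative only; the authors' exact computation, e.g. for pre/post or crossover designs, may differ):

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Hedges' g for two independent groups.

    Cohen's d with pooled SD, multiplied by the bias-correction
    factor J = 1 - 3 / (4*df - 1), where df = n1 + n2 - 2.
    """
    df = n1 + n2 - 2
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    cohens_d = (mean1 - mean2) / pooled_sd
    correction = 1 - 3 / (4 * df - 1)  # J, always slightly below 1
    return correction * cohens_d
```

Negative g values in the abstract (e.g. g = -0.60 for positive symptoms) indicate lower symptom scores under active gamma stimulation than under the control condition.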
Quide, Y.; Lim, T. E.; Gustin, S. M.
Background: Early-life adversity (ELA) is a risk factor for enduring pain in youth and is associated with alterations in brain morphology and function. However, it remains unclear whether ELA-related neurobiological changes contribute to the development of enduring pain in early adolescence. Methods: Using data from the Adolescent Brain Cognitive Development (ABCD) Study, we examined multimodal magnetic resonance imaging (MRI) markers in children assessed at baseline (ages 9-11 years) and at 2-year follow-up (ages 11-13 years). ELA exposure was defined at baseline to maximise temporal separation between early adversity and later enduring pain. Participants with enduring pain at follow-up (n = 322) were compared to matched pain-free controls (n = 644). Structural MRI, diffusion MRI (fractional anisotropy, mean diffusivity), and resting-state functional connectivity data were analysed. Linear models tested main effects of enduring pain, ELA, and their interaction on brain metrics, controlling for relevant covariates. Results: ELA exposure was associated with smaller caudate and nucleus accumbens volumes, and reduced surface area of the left rostral middle frontal gyrus. No significant effects of enduring pain or ELA-by-enduring pain interaction were observed across grey matter, white matter, or functional connectivity measures. Conclusions: ELA was associated with alterations in fronto-striatal regions in late childhood, but these changes were not linked to enduring pain in early adolescence. These findings suggest that ELA-related neurobiological alterations may represent early markers of vulnerability rather than concurrent correlates of enduring pain. Longitudinal follow-up is needed to determine whether these alterations contribute to later chronic pain risk.
Spann, D. J.; Hall, L. M.; Moussa-Tooks, A.; Sheffield, J. M.
Background: Negative symptoms are core features of schizophrenia that relate strongly to functional impairment, yet interventions targeting these symptoms remain largely ineffective. Emerging theoretical work highlights how environmental factors may shape and maintain negative symptoms. Although racial disparities in schizophrenia diagnosis among Black Americans are well documented and linked to racial stress and psychosis, the impact of racial stress on negative symptoms has not been examined. This study provides an initial test of a novel theory proposing that racial stress - here measured by racial discrimination - influences negative symptom severity through exacerbation of negative cognitions about the self, particularly defeatist performance beliefs (DPB). Study Design: Participants diagnosed with schizophrenia-spectrum disorder (SSD) (N = 208; 80 Black, 128 White) completed the Positive and Negative Syndrome Scale (PANSS), the Defeatist Beliefs Scale, and self-report measures of subjective racial and ethnic discrimination (Racial and Ethnic Minority Scale and General Ethnic Discrimination Scale). Relationships among variables were tested using linear regression and mediation analysis. Study Results: Black participants exhibited significantly greater total and experiential negative symptoms than White participants, with no group difference in DPB. Racial discrimination explained 46% of the relationship between race and negative symptoms. Among Black participants, higher DPB were associated with greater negative symptom severity. Discrimination was positively related to both DPB and negative symptoms. DPB partially mediated the relationship between discrimination and negative symptoms. Conclusions: Findings suggest that racial stress contributes to negative symptom severity via defeatist beliefs among Black individuals, highlighting potential targets for culturally informed interventions.
Xu, J.; Parker, R. M. A.; Bowman, K.; Clayton, G. L.; Lawlor, D. A.
Background: Higher levels of sedentary behaviour, such as leisure screen time (LST), and lower levels of physical activity are associated with diseases across multiple body systems which contribute to a large global health burden. Whether these associations are causal is unclear. The primary aim of this study is to investigate the causal effects of higher LST (given greater power) and, secondarily, lower moderate-to-vigorous intensity physical activity (MVPA), on a wide range of diseases in a hypothesis-free approach. Methods: A two-sample Mendelian randomisation phenome-wide association study was conducted for the main analyses. Genetic single nucleotide polymorphisms (SNPs) were first selected as exposure genetic instruments for LST (hours of television watched per day; 117 SNPs) and MVPA (higher vs. lower; 18 SNPs) based on the genome-wide significance threshold (p < 5×10^-8) from the largest relevant genome-wide association study (GWAS). For disease outcomes, we used summary results from FinnGen GWAS, including 1,719 diseases defined by hospital discharge International Classification of Diseases (ICD) codes in 453,733 European participants. For the main analyses, we used the inverse-variance weighting method with a Bonferroni-corrected p-value of p ≤ 3.47×10^-4. Sensitivity analyses included Steiger filtering, MR-Egger and weighted median analyses, and data from UK Biobank were used to explore replication. Findings: Genetically predicted higher LST was associated with increased risk of 87 (5.1% of the 1,719) diseases. Most of these diseases were in musculoskeletal and connective tissue (n=37), genitourinary (n=12) and respiratory (n=8) systems. Genetic liability to lower MVPA was associated with six diseases: three in musculoskeletal and connective tissue and genitourinary systems (with greater risk of these diseases also identified with higher LST), and three in respiratory and genitourinary systems. Sensitivity analyses largely supported the main analyses.
Results replicated in UK Biobank, where data were available. Conclusions Higher levels of sedentary behaviour, and lower levels of physical activity, causally increase the risk of diseases across multiple body systems, making them promising targets for reducing multimorbidity.
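The inverse-variance weighting method named above pools per-SNP Wald ratios, weighting each by the precision of its SNP-outcome association. A minimal sketch of the fixed-effect IVW estimate, with illustrative inputs rather than the study's data:

```python
import numpy as np

def ivw_mr(beta_exp, beta_out, se_out):
    """Fixed-effect inverse-variance weighted (IVW) MR estimate.

    beta_exp: SNP-exposure effects; beta_out, se_out: SNP-outcome
    effects and their standard errors (illustrative inputs only).
    """
    beta_exp = np.asarray(beta_exp, float)
    beta_out = np.asarray(beta_out, float)
    se_out = np.asarray(se_out, float)
    w = beta_exp**2 / se_out**2           # precision weights
    ratio = beta_out / beta_exp           # per-SNP Wald ratios
    beta = np.sum(w * ratio) / np.sum(w)  # pooled causal estimate
    se = np.sqrt(1.0 / np.sum(w))         # fixed-effect standard error
    return beta, se
```

This is algebraically the same as an intercept-free weighted regression of outcome effects on exposure effects, which is how IVW is usually described.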
Pietilainen, O.; Salonsalmi, A.; Rahkonen, O.; Lahelma, E.; Lallukka, T.
Objectives: Longer lifespans mean more time spent in retirement, despite efforts to raise the retirement age. It is therefore important to study how many retirement years can be spent free of disease. This study examined socioeconomic and sociodemographic differences in healthy years spent in retirement. Methods: We followed a cohort of retired Finnish municipal employees (N=4231, average follow-up 15.4 years) in national administrative registers for major chronic diseases: cancer, coronary heart disease, cerebrovascular disease, diabetes, asthma or chronic obstructive pulmonary disease, dementia, mental disorders, and alcohol-related disorders. Median healthy years in retirement and age at first occurrence of illness (ICD-10 and ATC-based) in each combination of sex, occupational class, and age of retirement were predicted using Royston-Parmar models. Prevalence rates for each diagnostic group were calculated. Results: The most healthy years in retirement were enjoyed by women who had worked in semi-professional jobs and retired at age 60-62 (median predicted healthy years 11.6, 95% CI 10.4-12.7); the fewest, by men who had worked in routine non-manual jobs and retired after age 62 (median predicted healthy years 6.5, 95% CI 4.4-9.5). Diabetes was slightly more common among lower occupational class women, and dementia among manual-class women who retired at age 60-62. Discussion: Healthy years in retirement are not enjoyed equally between women and men, or between those who retire earlier and later. Policies aiming to increase the retirement age should consider the effects of these gaps on retirees and the equitability of those effects.
Hung, J.; Smith, A.
The global ambition to end the human immunodeficiency virus (HIV) epidemic requires understanding which system-level policy levers, enacted under the framework of Universal Health Coverage (UHC), are most effective in achieving both transmission reduction and diagnostic coverage. This study addresses an important evidence gap by quantifying the within-country association between measurable UHC policy indicators and the estimated rate of new HIV infections across nine Southeast Asian countries between 2013 and 2022. Employing a fixed-effects panel data methodology, the analysis controls for time-invariant national heterogeneity, yielding more reliable estimates of policy impact. We found that marginal changes in total current health expenditure (CHE) as a percentage of gross domestic product (GDP) were not statistically significantly associated with changes in HIV incidence. However, increases in the UHC Infectious Disease Service Coverage Index were statistically significantly associated with concurrent reductions in HIV incidence (p < 0.001), suggesting that targeted service implementation is the principal driver of curbing new HIV infections. In addition, the UHC Reproductive, Maternal, Newborn, and Child Health Service Coverage Index exhibited a statistically significant positive association with changes in HIV incidence (p < 0.01), which we interpret as a surveillance artefact resulting from expanded detection and reporting of previously undiagnosed HIV cases. Furthermore, out-of-pocket (OOP) health expenditure as a percentage of CHE showed a counter-intuitive negative association with changes in HIV incidence (p < 0.01), suggesting that this metric primarily reflects ongoing indirect cost burdens on the established patient cohort or, alternatively, represents a diagnostic access barrier that results in lower case finding.
These findings suggest that policymakers should prioritise investment in targeted infectious disease service efficacy over aggregate fiscal commitment and utilise integrated sexual health platforms for strengthened HIV surveillance and case identification.
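The fixed-effects approach described above removes time-invariant national heterogeneity by demeaning each variable within country before regression. A minimal single-regressor sketch of this within transformation; variable names and data are illustrative, not the study's:

```python
import numpy as np

def within_estimator(y, x, groups):
    """Fixed-effects (within) estimator for one regressor: demean
    y and x within each group (country), then fit OLS through the
    origin on the demeaned data."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    groups = np.asarray(groups)
    yd, xd = y.copy(), x.copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= y[m].mean()   # remove the group (country) mean
        xd[m] -= x[m].mean()
    return (xd @ yd) / (xd @ xd)  # OLS slope on demeaned data
```

Because the group means absorb any time-invariant country effect, the slope is identified purely from within-country variation over time.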
Hassan, S. S.; Nordqvist-Kleppe, S.; Asinger, N.; Wang, J.; Dillner, J.; Arroyo Muhr, L. S.
Human papillomavirus (HPV) testing is the primary method for cervical cancer screening, and a negative HPV test is associated with a very low subsequent risk of invasive cancer. Nevertheless, a small number of cervical cancers are diagnosed following an HPV-negative testing result, posing challenges within HPV-based screening pathways. Using nationwide Swedish registry data on HPV testing, we identified women diagnosed with invasive cervical cancer between 2019 and 2024 and reconstructed HPV testing histories from the National Cervical Screening Registry (NKCx). The most recent HPV test prior to diagnosis was defined as the index test, and longitudinal HPV testing trajectories were classified among women with an HPV-negative index test. Of 3,000 women diagnosed with invasive cancer, 243 (8.1%) had an HPV-negative index test. These women were older at diagnosis and more frequently diagnosed at advanced stages compared with women with an HPV-positive index test. Most HPV-negative index tests (66.3%) were performed in the peri-diagnostic period (±30 days). Among women with an HPV-negative index test, 52.7% (128/243) had no prior HPV testing recorded, while the remainder had consistently HPV-negative histories (33.3%, 83/243) or evidence of prior HPV positivity before the index negative test (14%, 32/243). Possible recurrent HPV positivity following an intervening negative test was rare (0.4%, 1/243). HPV-negative screening results preceding invasive cancer reflect heterogeneous screening histories and cannot be explained solely by test failure. These findings highlight the importance of reaching women earlier in screening programs and show that fluctuating HPV detectability is rare.
Xiao, M.; Girard, Q.; Pender, M.; Rabezara, J. Y.; Rahary, P.; Randrianarisoa, S.; Rasambainarivo, F.; Rasolofoniaina, O.; Soarimalala, V.; Janko, M. M.; Nunn, C. L.
Purpose: Antibiotic use (ABU) is a major driver of antimicrobial resistance (AMR), but ABU patterns are poorly understood in low-income countries where the burden of AMR is great and ABU is insufficiently regulated. Here, we report ABU from ten sites ranging from rural villages to small cities in Madagascar, a country with high AMR levels, and present results from modeling to identify factors that may be associated with ABU in this setting. Methods: We conducted surveys of 290 individuals from ten sites in the SAVA Region of northeast Madagascar to gather data on sociodemographic characteristics, agricultural and animal husbandry practices, recent antibiotic use, the antibiotics that participants recalled using in their lifetimes, and the sources of their antibiotics. Using these data, we conducted statistical analyses with a mixed-effects logistic model to determine which characteristics were associated with recent antibiotic use. Results: Nearly all respondents (N=283, 97.6%) reported ABU in their lifetimes, with amoxicillin being the most widely reported antibiotic (N=255, 90.1% of those reporting ABU). All recalled antibiotics were classified as frontline drugs except for ciprofloxacin. Most respondents who reported antibiotic use also reported obtaining antibiotics without prescriptions from local stores (N=273, 96.5%), while only 52.3% (N=148) reported obtaining antibiotics through a prescriptive route, such as from a health clinic or private doctor. Of the 127 individuals (44.9%) who reported recent ABU, men were found to be significantly less likely to have recently taken antibiotics than women. Conclusions: Our findings provide new insights into ABU in agricultural settings in low-income countries, which have historically been understudied in AMR and pharmacoepidemiologic research. Knowledge of ABU patterns supports understanding of AMR dynamics and AMR control efforts in these contexts, such as interventions on inappropriate antibiotic dispensing.
Key points:
- Antibiotic use (ABU) in Madagascar is largely unstudied despite its role in antimicrobial resistance (AMR), of which Madagascar faces a high burden.
- ABU was widespread among livestock owners in northeast Madagascar: the majority of study participants reported ABU in their lifetimes, and most people reporting ABU had also taken antibiotics in the previous three months.
- Most respondents reported obtaining their antibiotics from non-pharmaceutical stores, indicating high levels of unregulated ABU, though more than half also reported sourcing antibiotics through prescriptive means (such as doctors and health clinics).
- Men were less likely than women to have taken antibiotics in the previous three months.
- These findings support the development of interventions to mitigate the burden of AMR in Madagascar and similar contexts while underscoring the need for more comprehensive research on the drivers and patterns of ABU.
Plain language summary: In this study, we provide basic information on antibiotic use (ABU) patterns in Madagascar, a country that experiences high levels of resistance but has been particularly understudied in AMR and pharmacological research. We surveyed 290 farmers with livestock from ten sites across northeast Madagascar about their ABU and found that nearly all study participants (N=283, 97.6%) had used antibiotics in their lifetimes, while a little under half of those who reported ABU also reported using antibiotics in the previous three months (N=127, 44.9%). The most used antibiotic was amoxicillin (N=255, 90.1%). Most people obtained their antibiotics from sources that do not require prescriptions, like general stores, indicating that most ABU is unregulated. Through modeling, we also found that men were less likely than women to have taken antibiotics in the previous three months (OR=0.50, 95% CI 0.30-0.82).
These findings help us better understand the dynamics of ABU in low-income countries, which have historically been understudied in AMR and pharmacological research. They also support efforts to mitigate the burden of AMR by revealing ABU dynamics that may contribute to the emergence and spread of AMR, as well as identifying targets for intervention to curb inappropriate ABU.
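The reported association (OR=0.50, 95% CI 0.30-0.82 for men vs. women) is the exponentiated coefficient from a logistic model. A minimal sketch of converting a logistic coefficient and its standard error into an odds ratio with a Wald 95% CI; the coefficient and SE below are hypothetical values chosen only because they roughly reproduce an interval of that shape:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its Wald
    interval endpoints to get an odds ratio with a 95% CI."""
    return (math.exp(beta),           # point estimate (OR)
            math.exp(beta - z * se),  # lower 95% bound
            math.exp(beta + z * se))  # upper 95% bound
```

For example, a coefficient near ln(0.50) ≈ -0.69 with SE ≈ 0.25 gives an OR of about 0.50 with a CI of roughly 0.31-0.82.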
Harikumar, A.; Baker, B.; Amen, D.; Keator, D.; Calhoun, V. D.
Single photon emission computed tomography (SPECT) is a highly specialized imaging modality that enables measurement of regional cerebral perfusion and, in particular, resting cerebral blood flow (rCBF). Recent technological advances have improved SPECT quantification and reliability, making it increasingly useful for studying rCBF abnormalities and perfusion network alterations in psychiatric and neurological disorders. To characterize large-scale functional organization in SPECT data, data-driven decomposition methods such as independent component analysis (ICA) have been used to extract covarying perfusion patterns that map onto interpretable brain networks. Blind ICA provides a data-driven approach to estimate these networks without strong prior assumptions. More recently, a hybrid approach that leverages spatial priors to guide a spatially constrained ICA (sc-ICA) has been used to fully automate the ICA analysis while also providing participant-specific network estimates. While this has been reliably demonstrated in fMRI with the NeuroMark template, there is currently no comparable SPECT template. A SPECT template would enable automatic estimation of functional SPECT networks with participant-specific expressions that correspond across participants and studies. The current study introduces a new replicable NeuroMark SPECT template for estimating canonical perfusion covariance patterns (networks). We first identify replicable SPECT networks using blind ICA applied to two large-sample SPECT datasets. We then demonstrate the use of the resulting template by applying sc-ICA to an independent schizophrenia dataset. In sum, this work presents and shares the first NeuroMark SPECT template and demonstrates its utility in an independent cohort, providing a scalable and robust framework for network-based analyses.
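The template-guided idea can be illustrated without the full sc-ICA machinery: given group-level template maps, a participant-specific loading per network can be obtained by regressing the subject's voxel map on the templates. This is a dual-regression-style sketch, not the NeuroMark sc-ICA algorithm itself, and all names and data are hypothetical:

```python
import numpy as np

def template_loadings(subject_map, template_maps):
    """Estimate participant-specific network loadings by least-squares
    regression of a subject's voxel map on group template maps
    (a dual-regression-style sketch, not sc-ICA proper)."""
    T = np.asarray(template_maps, float)  # (n_networks, n_voxels)
    y = np.asarray(subject_map, float)    # (n_voxels,)
    coef, *_ = np.linalg.lstsq(T.T, y, rcond=None)
    return coef                           # one loading per network
```

Because every subject is projected onto the same template, the resulting loadings correspond across participants and studies, which is the property the abstract highlights.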
Areb, M.; Huybregts, L.; Tamiru, D.; Toure, M.; Biru, B.; Fall, T.; Haddis, A.; Belachew, T.
Background: This study aimed to assess caregiver knowledge of Infant and Young Child Feeding (IYCF), child health, severe acute malnutrition (SAM) screening, and Community-Based Management of Acute Malnutrition (CMAM), its determinants, and its associations with IYCF/WaSH (water, sanitation, and hygiene) practices among caregivers of children 6-59 months with SAM in Ethiopian agrarian and pastoralist settings. Methods: Data were from the baseline survey of the R-SWITCH Ethiopia cluster-randomized controlled trial (cRCT), which screened ~28,000 children aged 6-59 months and identified 686 SAM cases. Caregiver knowledge was evaluated using a validated 32-item questionnaire (Cronbach's α for internal reliability) and analyzed via linear mixed-effects and Poisson regression models in Stata 17. Results: Caregiver knowledge was positively associated with improved IYCF/WaSH practices among children aged 6-23 months with SAM, including higher minimum dietary diversity (MDD: IRR=1.50), minimum acceptable diet (MAD: IRR=1.63), and reduced zero vegetable/fruit intake (IRR=0.77), as well as MDD in children aged 24-59 months, improved water access (IRR=1.19), water treatment (IRR=2.02), and handwashing stations (IRR=1.41). Being literate (β = 4.1; 95% CI: 1.5-6.6, p = 0.016), being pregnant (β = 4.4; 95% CI: 0.9-7.8, p = 0.018), having the child weighed at a health post/health center (β = 8.9; 95% CI: 3.5-14.2, p ≤ 0.001), and higher household wealth index (β = 11.8; 95% CI: 3.6-20.1, p = 0.005) were associated with higher knowledge, while possible depression (β = -0.3; 95% CI: -0.5 to 0.0, p = 0.015) was associated with lower knowledge. Conclusion: Caregiver knowledge is associated with better IYCF/WaSH practices among children aged 6-59 months with SAM. Literacy, pregnancy, having the child weighed at a health post or health center, and greater household wealth were associated with higher caregiver knowledge, whereas possible depression was associated with lower knowledge.
Integrating context-specific caregiver education and mental health support into CMAM, GMP (growth monitoring and promotion), and primary care services could enhance feeding/WaSH practices in Ethiopia.
Heffernan, P. M.; van den Berg, H.; Yadav, R. S.; Murdock, C. C.; Rohr, J. R.
Background: Insecticides remain the cornerstone of mosquito vector control for malaria, dengue, and other mosquito-borne diseases, yet global patterns of deployment and their socioeconomic and environmental drivers are poorly characterized. Understanding where and why insecticides are used is essential for better targeting control efforts and ensuring they are effective, equitable, and efficient. Methods: We analyzed annual country-level insecticide-use data from 122 countries (1990-2019), reported as standard spray coverage for insecticide-treated nets (ITNs), residual spraying (RS), spatial spraying (SS), and larviciding (LA). Generalized linear mixed models and hurdle models quantified associations between deployment and disease incidence, human development index (HDI), human population density, temperature, and precipitation. Models were evaluated using repeated cross-validation and applied to generate downscaled predictions of insecticide use at subnational administrative region level 2 (ADM2) globally. Findings: Insecticide deployment increased with malaria and dengue incidence, but this response was substantially stronger in higher-HDI countries, indicating that deployment depends on socioeconomic capacity as well as disease burden, which leads to weaker scaling in lower-resource settings. Intervention types exhibited distinct patterns: ITN use tracked malaria burden, whereas infrastructure-intensive approaches (e.g., RS and SS) were concentrated in higher-HDI settings and increased with Aedes-borne disease incidence. Downscaled ADM2-level maps uncovered substantial within-country heterogeneity that is obscured at the national scale, highlighting regions across sub-Saharan Africa, South Asia, and parts of Latin America where predicted deployment remains low relative to disease risk. Interpretation: Global insecticide deployment reflects not only epidemiological need but also economic and logistical capacity, creating mismatches between risk and control.
High-resolution mapping can support more equitable allocation of interventions, guide insecticide resistance stewardship, and improve strategic planning as climate and urbanization reshape mosquito-borne disease risk.